Search results for "neural computation"

Showing 10 of 11 documents

2019

As rats learn to search for multiple sources of food or water in a complex environment, they generate increasingly efficient trajectories between reward sites. Such spatial navigation capacity involves the replay of hippocampal place-cells during awake states, generating small sequences of spatially related place-cell activity that we call "snippets". These snippets occur primarily during sharp-wave-ripples (SWRs). Here we focus on the role of such replay events, as the animal is learning a traveling salesperson task (TSP) across multiple trials. We hypothesize that snippet replay generates synthetic data that can substantially expand and restructure the experience available and make learni…

Keywords: 0301 basic medicine; Computer science; Place cell; Machine learning; Spatial memory; Synthetic data; 03 medical and health sciences; Cellular and Molecular Neuroscience; 0302 clinical medicine; Models of neural computation; Genetics; Reinforcement learning; Molecular Biology; Ecology, Evolution, Behavior and Systematics; Ecology; Reservoir computing; Snippet; 030104 developmental biology; Computational Theory and Mathematics; Modeling and Simulation; Sequence learning; Artificial intelligence; 030217 neurology & neurosurgery; PLOS Computational Biology

A stable brain from unstable components: Emerging concepts and implications for neural computation.

2017

Neuroscientists have often described the adult brain in terms similar to an electronic circuit board: dependent on fixed, precise connectivity. However, with the advent of technologies allowing chronic measurements of neural structure and function, the emerging picture is that neural networks undergo significant remodeling over multiple timescales, even in the absence of experimenter-induced learning or sensory perturbation. Here, we attempt to reconcile the parallel observations that critical brain functions are stably maintained, while synapse- and single-cell properties appear to be reformatted regularly throughout adult life. In this review, we discuss experimental evidence at multiple …

Keywords: 0301 basic medicine; Neurons; Artificial neural network; General Neuroscience; Computation; Models, Neurological; Brain; Sensory system; Synapse; 03 medical and health sciences; 030104 developmental biology; 0302 clinical medicine; Models of neural computation; Biological neural network; Animals; Humans; Neural Networks, Computer; Psychology; Neuroscience; 030217 neurology & neurosurgery; Dynamic equilibrium; Electronic circuit

Relation between fixation disparity and the asymmetry between convergent and divergent disparity step responses

2007

Abstract The neural network model of Patel et al. [Patel, S. S., Jiang, B. C., & Ogmen, H. (2001). Vergence dynamics predict fixation disparity. Neural Computation, 13 (7), 1495–1525] predicts that fixation disparity, the vergence error for a stationary fusion stimulus, is the result of asymmetrical dynamic properties of disparity vergence mechanisms: faster (slower) convergent than divergent responses give rise to an eso (exo) fixation disparity, i.e., over-convergence (under-convergence) in stationary fixation. This hypothesis was tested in the present study with an inter-individual approach: in 16 subjects we estimated the vergence step response to a 1 deg disparity stimulus with a subje…

Keywords: Adult; Vision Disparity; Models, Neurological; Fixation, Ocular; Stimulus (physiology); Asymmetry; Divergence; Models of neural computation; Optics; Humans; Mathematics; Vision, Binocular; Mathematical analysis; Convergence, Ocular; Nonius; Sensory Systems; Ophthalmology; Convergent and divergent production; Nonius lines; Binocular vision; Convergence; Fixation disparity; Photic Stimulation; Vision Research

Why Cortices? Neural Computation in the Vertebrate Visual System

1989

We propose three high-level structural principles of neural networks in the vertebrate visual cortex and discuss some of their computational implications for early vision: a) Lamination, average axonal and dendritic domains, and intrinsic feedback determine the spatio-temporal interactions in cortical processing. Possible applications of the resulting filters include continuous motion perception and the direct measurement of high-level parameters of image flow. b) Retinotopic mapping is an emergent property of massively parallel connections. With a local intrinsic operation in the target area, mapping combines into a space-variant image processing system as would be useful in the analysis of …

Keywords: Artificial neural network; Computer science; Property (programming); Optical flow; Pattern recognition; Image processing; Visual cortex; Models of neural computation; Motion perception; Artificial intelligence; Massively parallel

Why Cortices? Neural Networks for Visual Information Processing

1989

Neural networks for the processing of sensory information show remarkable similarities between different species and across different sensory modalities. As an example, cortical organization found in the mammalian neopallium and in the optic tecta of most vertebrates appears to be equally appropriate as a substrate for visual, auditory, and somatosensory information processing. In this paper, we formulate three structural principles of the vertebrate visual cortex that allow us to analyze the structure and function of these neural networks at an intermediate level of complexity. Computational applications are taken from the field of early vision. The proposed principles are: (a) Average anatomy, i …

Keywords: Artificial neural network; Computer science; Optical flow; Pattern recognition; Sensory system; Image processing; Models of neural computation; Visual cortex; Receptive field; Artificial intelligence; Motion perception; Neuroscience

The Stability-Plasticity Dilemma: Investigating the Continuum from Catastrophic Forgetting to Age-Limited Learning Effects

2013

The stability-plasticity dilemma is a well-known constraint for artificial and biological neural systems. The basic idea is that learning in a parallel and distributed system requires plasticity for the integration of new knowledge, but also stability in order to prevent the forgetting of previous knowledge. Too much plasticity will result in previously encoded data being constantly forgotten, whereas too much stability will impede the efficient coding of this data at the level of the synapses. However, for the most part, neural computation has addressed the problems related to excessive plasticity and excessive stability as two different fields in the literature.
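To make the described tradeoff concrete, here is a toy sketch (not from the article; all names and values are illustrative): a one-weight linear model trained by plain gradient descent learns task A, then, retrained with the same unconstrained plasticity on task B, loses task A entirely.

```python
# Toy catastrophic-forgetting demo: one weight, plain SGD, and no
# stability mechanism, so learning task B overwrites task A.

def train(w, pairs, lr=0.1, steps=200):
    """Gradient descent on squared error for the model y = w * x."""
    for _ in range(steps):
        for x, y in pairs:
            w -= lr * 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
    return w

def mse(w, pairs):
    return sum((w * x - y) ** 2 for x, y in pairs) / len(pairs)

task_a = [(1.0, 2.0), (2.0, 4.0)]    # solvable with w = 2
task_b = [(1.0, -1.0), (2.0, -2.0)]  # solvable with w = -1

w = train(0.0, task_a)
err_a_before = mse(w, task_a)  # ~0: task A learned

w = train(w, task_b)           # full plasticity, zero stability
err_a_after = mse(w, task_a)   # large: task A forgotten
err_b = mse(w, task_b)         # ~0: task B learned
```

A stability mechanism (for example, a penalty anchoring the weight near its task-A value) would trade some task-B accuracy for retained task-A performance, which is exactly the dilemma the article describes.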

Keywords: Computer science; lcsh:BF1-990; Catastrophic Forgetting; 02 engineering and technology; Plasticity; 050105 experimental psychology; Psycholinguistics; Learning effect; Models of neural computation; Connectionism; neural computation; 0202 electrical engineering, electronic engineering, information engineering; Psychology; 0501 psychology and cognitive sciences; General Psychology; Cognitive science; Forgetting; Parallel Distributed Processing; Age of Acquisition; 05 social sciences; Opinion Article; Dilemma; lcsh:Psychology; [SDV.NEU] Life Sciences [q-bio]/Neurons and Cognition [q-bio.NC]; 020201 artificial intelligence & image processing; Artificial intelligence; Coding (social sciences); Frontiers in Psychology

A Survey of Continuous-Time Computation Theory

1997

Motivated partly by the resurgence of neural computation research, and partly by advances in device technology, there has been a recent increase of interest in analog, continuous-time computation. However, while special-case algorithms and devices are being developed, relatively little work exists on the general theory of continuous-time models of computation. In this paper, we survey the existing models and results in this area, and point to some of the open research questions.

Keywords: Discrete mathematics; Theoretical computer science; Computability; Computation; Model of computation; neuraalilaskenta (neural computation); Turing machine; Models of neural computation; Computable function; Open research; Theory of computation; Hopfield network; cellular automaton; differential analyzer; Mathematics

Coarse scales are sufficient for efficient categorization of emotional facial expressions: Evidence from neural computation

2010

The human perceptual system performs rapid processing within the early visual system: low spatial frequency information is processed rapidly through magnocellular layers, whereas the parvocellular layers process all the spatial frequencies more slowly. The purpose of the present paper is to test the usefulness of low spatial frequency (LSF) information compared to high spatial frequency (HSF) and broad spatial frequency (BSF) visual stimuli in a classification task of emotional facial expressions (EFE) by artificial neural networks. The connectionist modeling results show that the LSF information provided by the frequency domain is sufficient for a distributed neural network to correctly cla…
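As a loose illustration of why coarse-scale information can suffice, a simple moving-average low-pass filter (a stand-in for the spatial-frequency filtering the abstract mentions; the parameters and signal here are illustrative, not the paper's stimuli) keeps the coarse shape of a signal while discarding fine detail:

```python
import statistics

def low_pass(signal, width=5):
    """Moving-average low-pass filter: keeps low-spatial-frequency
    structure, attenuates high-frequency detail."""
    half = width // 2
    return [statistics.mean(signal[max(0, i - half):i + half + 1])
            for i in range(len(signal))]

# A coarse linear trend with fine alternating detail on top.
trend = [i / 10 for i in range(20)]
signal = [t + (0.5 if i % 2 == 0 else -0.5) for i, t in enumerate(trend)]
smooth = low_pass(signal)  # close to the trend; detail largely cancelled
```

The filtered signal tracks the underlying trend far more closely than the raw one, which is the sense in which LSF content alone can carry the category-relevant structure.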

Keywords: Facial expression; Visual perception; Artificial neural network; Computer science; Cognitive Neuroscience; Pattern recognition; Cognitive neuroscience; Computer Science Applications; Perceptual system; Models of neural computation; Connectionism; Artificial Intelligence; Parvocellular cell; Frequency domain; Computer vision; Neurocomputing

Exponential Transients in Continuous-Time Symmetric Hopfield Nets

2001

We establish a fundamental result in the theory of continuous-time neural computation by showing that so-called continuous-time symmetric Hopfield nets, whose asymptotic convergence is always guaranteed by the existence of a Lyapunov function, may, in the worst case, possess a transient period that is exponential in the network size. The result stands in contrast to, e.g., the use of such network models in combinatorial optimization applications.
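A minimal numerical sketch of the setup (illustrative weights, initial state, and step size, not the paper's worst-case construction): Euler integration of a continuous-time symmetric Hopfield net, together with the energy function whose monotone decrease guarantees convergence.

```python
import math

def energy(W, v):
    """Lyapunov function for a symmetric continuous-time Hopfield net
    with tanh units: quadratic term plus the integral of atanh."""
    n = len(v)
    quad = -0.5 * sum(W[i][j] * v[i] * v[j]
                      for i in range(n) for j in range(n))
    leak = sum(vi * math.atanh(vi) + 0.5 * math.log(1.0 - vi * vi)
               for vi in v)
    return quad + leak

def simulate(W, u0, dt=0.01, steps=2000):
    """Euler-integrate du_i/dt = -u_i + sum_j W[i][j] * tanh(u_j)."""
    u = list(u0)
    energies = []
    for _ in range(steps):
        v = [math.tanh(ui) for ui in u]
        energies.append(energy(W, v))
        drive = [sum(W[i][j] * v[j] for j in range(len(u)))
                 for i in range(len(u))]
        u = [ui + dt * (-ui + di) for ui, di in zip(u, drive)]
    return u, energies

# Symmetric weights, so the energy never increases along a trajectory.
W = [[0.0, 1.0, 0.5],
     [1.0, 0.0, -0.3],
     [0.5, -0.3, 0.0]]
u_final, energies = simulate(W, [0.5, -0.4, 0.3])
```

Convergence itself is guaranteed by the decreasing energy; the paper's point is that the transient before a fixed point is reached can, in the worst case, last exponentially long in the network size.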

Keywords: Lyapunov function; Hopfield nets; stability; neural networks; Exponential function; Hopfield network; Models of neural computation; Recurrent neural network; Convergence; Applied mathematics; Combinatorial optimization; dynaamiset systeemit (dynamical systems); Algorithm; Mathematics; Network model

Psychophysically Tuned Divisive Normalization Approximately Factorizes the PDF of Natural Images

2010

The conventional approach in computational neuroscience in favor of the efficient coding hypothesis goes from image statistics to perception. It has been argued that the behavior of the early stages of biological visual processing (e.g., spatial frequency analyzers and their nonlinearities) may be obtained from image samples and the efficient coding hypothesis using no psychophysical or physiological information. In this work we address the same issue in the opposite direction: from perception to image statistics. We show that psychophysically fitted image representation in V1 has appealing statistical properties, for example, approximate PDF factorization and substantial mutual informatio…
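The divisive-normalization nonlinearity underlying such V1 models can be sketched generically as follows (uniform pooling weights and illustrative beta and gamma values, not the psychophysically fitted parameters of the paper): each response is divided by a pooled measure of its neighbours' activity.

```python
import math

def divisive_normalization(x, beta=0.1, gamma=2.0):
    """Generic divisive normalization:
    r_i = sign(x_i) * |x_i|**gamma / (beta**gamma + sum_j |x_j|**gamma),
    with uniform pooling weights over all responses."""
    pooled = sum(abs(xj) ** gamma for xj in x)
    denom = beta ** gamma + pooled
    return [math.copysign(abs(xi) ** gamma, xi) / denom for xi in x]

responses = divisive_normalization([1.0, 0.5, -0.25])
doubled = divisive_normalization([2.0, 1.0, -0.5])  # same pattern, 2x contrast
```

Doubling the input contrast leaves the normalized responses nearly unchanged: the saturating gain control characteristic of this model family, and the source of the statistical effects (such as reduced dependence between coefficients) that the paper analyzes.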

Keywords: Neurons; Computational neuroscience; Cognitive Neuroscience; Models, Neurological; Normalization (image processing); Pattern recognition; Mutual information; Information theory; Machine learning; Visual processing; Models of neural computation; Arts and Humanities (miscellaneous); Perception; Visual Perception; Artificial intelligence; Efficient coding hypothesis; Visual Cortex; Mathematics; Neural Computation